Channel estimation algorithm based on cell reference signal in LTE-A system
LI Huimin, ZHANG Zhizhong, LI Linxiao
Journal of Computer Applications    2018, 38 (7): 2009-2014.   DOI: 10.11772/j.issn.1001-9081.2017123054
Interpolation algorithms are commonly used to estimate the channel frequency response at data positions in the Long Term Evolution-Advanced (LTE-A) system. To address the problems that the traditional Linear Minimum Mean Square Error (LMMSE) algorithm requires the channel statistics to be known in advance and suffers from high computational complexity due to a matrix inversion, an improved LMMSE channel estimation interpolation algorithm was proposed. Firstly, the pilots were interpolated to add virtual pilots, which improved the performance of the algorithm. Secondly, an approximate estimation method for the autocorrelation matrix and the Signal-to-Noise Ratio (SNR) was derived by exploiting the fact that the channel energy is concentrated in the time domain. Finally, a sliding window method was adopted to further reduce the complexity of the LMMSE interpolation in the frequency domain. The simulation results show that the overall performance of the proposed algorithm is better than that of the linear interpolation method and the Discrete Fourier Transform (DFT) interpolation method, and that its Bit Error Rate (BER) and Mean Squared Error (MSE) performance is similar to that of the traditional LMMSE interpolation algorithm. Furthermore, it reduces the complexity by 98.67% compared with the traditional LMMSE estimator without degrading the overall BER and MSE performance, making it suitable for practical engineering applications.
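A minimal sketch of the sliding-window LMMSE frequency-domain interpolation idea described above follows, assuming an exponential power delay profile for the channel frequency correlation; the window size, correlation model and virtual-pilot handling are illustrative assumptions rather than the approximations derived in the paper.

```python
import numpy as np

def lmmse_sliding_window(h_ls_pilots, pilot_idx, n_subcarriers, snr_linear,
                         rms_delay=1e-6, subcarrier_spacing=15e3, window=8):
    """Sliding-window LMMSE interpolation over the frequency axis (sketch).

    h_ls_pilots : least-squares channel estimates at the pilot subcarriers
                  (virtual pilots, if any, are simply included in this array)
    pilot_idx   : subcarrier indices of those pilots
    window      : number of nearest pilots used for each data subcarrier;
                  this bounds the matrix inversion to window x window
    The exponential-PDP frequency correlation below is an assumption, not
    the autocorrelation approximation proposed in the paper.
    """
    pilot_idx = np.asarray(pilot_idx)
    h_ls_pilots = np.asarray(h_ls_pilots)

    def r_f(delta_k):
        # frequency correlation for an exponential power delay profile
        return 1.0 / (1.0 + 1j * 2 * np.pi * rms_delay * delta_k * subcarrier_spacing)

    h_est = np.zeros(n_subcarriers, dtype=complex)
    for k in range(n_subcarriers):
        # use only the `window` pilots closest to subcarrier k
        near = np.argsort(np.abs(pilot_idx - k))[:window]
        p = pilot_idx[near]
        r_dp = r_f(k - p)                      # cross-correlation, shape (window,)
        R_pp = r_f(p[:, None] - p[None, :])    # pilot autocorrelation, (window, window)
        A = R_pp + np.eye(len(p)) / snr_linear
        # LMMSE estimate: r_dp * inv(A) * h_p, solved without forming inv(A)
        h_est[k] = r_dp @ np.linalg.solve(A, h_ls_pilots[near])
    return h_est
```

Because each estimate only involves a window-by-window linear system instead of an inversion over the full pilot grid, the cost grows linearly with the number of subcarriers, which mirrors the complexity reduction the sliding-window approach targets.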
Simulation and implementation of physical random access channel signal detection in long term evolution advanced system
ZHANG Yajing, LIU Yulin, ZHANG Zhizhong
Journal of Computer Applications    2018, 38 (5): 1442-1446.   DOI: 10.11772/j.issn.1001-9081.2017102600
Considering the influence of Doppler frequency shift on the detection of Physical Random Access Channel (PRACH) signals, the signal detection algorithms were divided into medium-speed, high-speed and ultra-high-speed modes, and each was improved separately. In the medium-speed mode, a preamble detection algorithm based on frequency offset correction was proposed. In the high-speed mode, a multi-sliding-window peak detection algorithm was proposed. In the ultra-high-speed mode, a frequency offset compensation preamble detection algorithm based on integer subcarriers was proposed. The simulation results show that, across different scenarios, the false alarm rate performance of the receiver is improved by at least 3.8 dB when the PRACH signals are transmitted through the Additive White Gaussian Noise (AWGN) channel, and by at least 1 dB when the PRACH signals are transmitted through the Extended Typical Urban (ETU) channel model. Compared with the frequency-domain correlation detection algorithm, the proposed algorithms improve the probability of successful preamble detection and reduce the random access delay.
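As background for the improved detectors, the sketch below shows the baseline frequency-domain correlation detection that the proposed algorithms are compared against: correlate the received PRACH subcarriers with the root Zadoff-Chu sequence, transform back to the delay domain, and search each cyclic-shift window for a peak. The threshold, the window-to-shift mapping and the sequence length are illustrative assumptions.

```python
import numpy as np

N_ZC = 839  # long preamble format sequence length

def zadoff_chu(u, n_zc=N_ZC):
    """Root Zadoff-Chu sequence with root index u."""
    n = np.arange(n_zc)
    return np.exp(-1j * np.pi * u * n * (n + 1) / n_zc)

def detect_preambles(rx_freq, root_u, n_cs, threshold_factor=10.0):
    """Baseline frequency-domain PRACH correlation detection (sketch).

    rx_freq : received preamble samples already de-mapped onto the 839
              PRACH subcarriers (frequency domain)
    n_cs    : cyclic shift step; each shift defines one candidate preamble
    threshold_factor and the simple window layout below are assumptions,
    not the thresholds used in the paper.
    """
    x_u = np.fft.fft(zadoff_chu(root_u))
    # frequency-domain correlation, then back to the delay domain
    pdp = np.abs(np.fft.ifft(rx_freq * np.conj(x_u))) ** 2
    noise_floor = np.median(pdp) + 1e-12

    detected = []
    for v in range(len(pdp) // n_cs):
        # search window associated with the v-th cyclic shift (simplified mapping)
        win = pdp[v * n_cs:(v + 1) * n_cs]
        peak_pos = int(np.argmax(win))
        if win[peak_pos] > threshold_factor * noise_floor:
            # report the candidate preamble index and its delay within the window
            detected.append((v, peak_pos))
    return detected
```

The paper's medium-, high- and ultra-high-speed modes then add frequency offset correction, multi-sliding-window peak search and integer-subcarrier offset compensation, respectively, on top of correlation detection of this kind.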
Design and implementation of carrier aggregation in LTE-A air-interface analyzer
LI Ruying, ZHANG Zhizhong, DENG Xiangtian
Journal of Computer Applications    2018, 38 (3): 786-790.   DOI: 10.11772/j.issn.1001-9081.2017081988
Focusing on the difficulties in testing and optimizing communication networks introduced by key technologies such as carrier aggregation, and on the shortage of Long Term Evolution (LTE) air-interface analyzers in the domestic market, a design scheme for a Long Term Evolution-Advanced (LTE-A) air-interface analyzer was proposed that supports the 3GPP R10/11 protocol standards and LTE-A key technologies such as carrier aggregation. Firstly, the physical and logical architectures of the LTE-A air-interface analyzer were introduced, the relationship between the two and the function of each module in both architectures were illustrated, and an implementation scheme for carrier aggregation in the instrument was then designed. In addition, to meet the demands of new technologies in communication networks as well as the requirements of user and base station equipment testing, a scheme supporting multiple users under multi-carrier and multi-cell conditions was proposed for the analyzer. Applying this scheme can accelerate the commercialization of carrier aggregation in communication networks, speed up network deployment, shorten the network construction cycle, and play an indispensable role in communication network operation and maintenance.
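Purely as an illustration of how an analyzer might organize per-UE carrier-aggregation state to support multiple users across multiple carriers and cells, a minimal sketch follows; all type and field names are hypothetical and are not taken from the paper.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ComponentCarrier:
    """One component carrier observed by the analyzer (illustrative fields)."""
    earfcn: int        # E-UTRA absolute radio frequency channel number
    bandwidth_rb: int  # carrier bandwidth in resource blocks (100 = 20 MHz)
    cell_id: int       # physical cell ID the carrier belongs to

@dataclass
class UeContext:
    """Per-UE aggregation state: one primary cell plus secondary cells."""
    rnti: int
    pcell: ComponentCarrier
    scells: List[ComponentCarrier] = field(default_factory=list)

# One context per decoded RNTI lets traffic from several UEs be
# demultiplexed independently on every aggregated carrier.
ue_table: Dict[int, UeContext] = {}

def add_scell(rnti: int, cc: ComponentCarrier) -> None:
    """Attach a secondary carrier to an already-tracked UE."""
    ue_table[rnti].scells.append(cc)
```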
Design and implementation of PDSCH de-resource mapping in LTE-A air interface analyzer
WANG Meile, ZHANG Zhizhong, WANG Guangya
Journal of Computer Applications    2018, 38 (10): 2945-2949.   DOI: 10.11772/j.issn.1001-9081.2018030518
To address the computational redundancy caused by repeatedly computing resource mapping positions in the traditional de-resource mapping method of the Long Term Evolution-Advanced (LTE-A) physical layer, a new architecture for the Physical Downlink Shared Channel (PDSCH) de-resource mapping method was proposed, which supports the related physical layer processing of the LTE-A air-interface analyzer. Firstly, before de-mapping the physical downlink signals and channels, the resource indexes of each signal and channel were generated for single antenna port 0 mode, transmit diversity mode, single-stream beamforming and dual-stream beamforming. Then, the time-frequency location of each resource was located directly according to its resource index. Finally, the PDSCH de-resource mapping module was integrated into the complete LTE-A link-level simulation platform, simulations were run in the four transmission modes, and the corresponding bit error rate and throughput comparison charts were obtained, providing a theoretical reference for the final hardware implementation. Compared with the de-resource mapping module under the traditional architecture, the module under the new architecture takes 33.33% less time than the traditional recomputation-based mapping, reducing both time and device resource consumption during de-resource mapping.
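A minimal sketch of the precompute-then-index idea behind the new de-resource mapping architecture is given below: the positions occupied by reference signals and control channels are resolved once into an index table, and every subsequent subframe is de-mapped by direct array indexing instead of recomputing the mapping. How the reserved positions are derived per transmission mode follows the 3GPP mapping rules and the paper, not this sketch.

```python
import numpy as np

def build_pdsch_index_table(n_rb, n_symbols=14, reserved_res=()):
    """Precompute the flat indices of PDSCH resource elements (sketch).

    n_rb         : downlink bandwidth in resource blocks
    reserved_res : iterable of (subcarrier, symbol) pairs occupied by
                   reference signals and control channels for the chosen
                   transmission mode (derivation of this set is out of scope)
    """
    n_sc = 12 * n_rb
    reserved = set(reserved_res)
    return np.array([k * n_symbols + l
                     for k in range(n_sc)
                     for l in range(n_symbols)
                     if (k, l) not in reserved], dtype=np.int64)

def demap_pdsch(resource_grid, index_table):
    """De-map PDSCH symbols by direct indexing; no per-subframe recomputation."""
    # resource_grid has shape (n_subcarriers, n_symbols); the same table is
    # reused for every subframe with the same configuration
    return resource_grid.reshape(-1)[index_table]
```

Reusing one table across subframes with the same configuration is what removes the repeated position computation identified above as the source of redundancy.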